Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization

Authors

Abstract

Timeline summarization (TLS) is defined as the task of summarizing events in chronological order, which gives readers a comprehensive understanding of an evolutionary story. Previous studies on TLS ignored the information interaction between sentences and dates and adopted pre-defined, unlearnable representations for them, which significantly degrades performance. They also treated date selection and event detection as two independent tasks, which makes it impossible to integrate their advantages and obtain a globally optimal summary. In this paper, we present a joint learning-based heterogeneous graph attention network for TLS (HeterTls), in which date selection and event detection are combined into a unified framework to improve extraction accuracy and remove redundant sentences simultaneously. Our heterogeneous graph involves multiple types of nodes, whose representations are iteratively learned across the heterogeneous graph attention layer. We evaluated our model on four datasets and found that it outperformed the current state-of-the-art baselines with regard to ROUGE scores and date selection metrics.
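
Below is a minimal PyTorch sketch of the kind of heterogeneous graph attention layer the abstract names, with sentence and date nodes exchanging information. This is not the authors' implementation: the two node types, the edge format, the dimensions, and the per-date softmax aggregation are assumptions made purely for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HeteroGATLayer(nn.Module):
    """One attention layer over a bipartite sentence-date subgraph (illustrative only)."""
    def __init__(self, dim):
        super().__init__()
        # Separate projections per node type, as is common in heterogeneous GATs.
        self.proj = nn.ModuleDict({"sentence": nn.Linear(dim, dim),
                                   "date": nn.Linear(dim, dim)})
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, h_sent, h_date, edges):
        # edges: LongTensor of shape (E, 2) holding (sentence_idx, date_idx) pairs.
        s = self.proj["sentence"](h_sent)            # (Ns, dim)
        d = self.proj["date"](h_date)                # (Nd, dim)
        src, dst = edges[:, 0], edges[:, 1]
        # Unnormalized attention score for every sentence -> date edge.
        e = F.leaky_relu(self.attn(torch.cat([s[src], d[dst]], dim=-1))).squeeze(-1)
        # Softmax over the incoming edges of each date node.
        alpha = torch.zeros_like(e)
        for j in dst.unique():
            mask = dst == j
            alpha[mask] = F.softmax(e[mask], dim=0)
        # Aggregate attention-weighted sentence messages into date representations.
        d_new = d.clone()
        d_new.index_add_(0, dst, alpha.unsqueeze(-1) * s[src])
        return s, torch.tanh(d_new)

# Example: 5 sentence nodes, 3 date nodes, random features, a few edges.
layer = HeteroGATLayer(dim=64)
s_out, d_out = layer(torch.randn(5, 64), torch.randn(3, 64),
                     torch.tensor([[0, 0], [1, 0], [2, 1], [3, 2], [4, 2]]))

Stacking several such layers, and adding further node types, would correspond to the iterative representation learning the abstract refers to; the actual model additionally couples this with joint date-selection and event-detection objectives.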


Similar articles

Joint Graphical Models for Date Selection in Timeline Summarization

Automatic timeline summarization (TLS) generates precise, dated overviews over (often prolonged) events, such as wars or economic crises. One subtask of TLS selects the most important dates for an event within a certain time frame. Date selection has up to now been handled via supervised machine learning approaches that estimate the importance of each date separately, using features such as the...
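
As a generic illustration of the per-date supervised setup described above, the snippet below scores each candidate date independently with a classifier. The feature names, the toy data, and the use of scikit-learn's LogisticRegression are illustrative assumptions, not details from the paper.

from sklearn.linear_model import LogisticRegression

# Toy feature rows for three candidate dates; the feature set (mentions of the
# date, articles published that day, a burstiness score) is purely illustrative.
X_train = [[12, 4, 0.80], [2, 1, 0.10], [30, 9, 0.95]]
y_train = [1, 0, 1]  # 1 if the date appears in the reference timeline

clf = LogisticRegression().fit(X_train, y_train)
# Each candidate date is scored independently of all other dates.
importance = clf.predict_proba([[15, 5, 0.70]])[0, 1]
print(f"estimated date importance: {importance:.2f}")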


Attention-based Graph Neural Network for Semi-supervised Learning

Recently popularized graph neural networks achieve the state-of-the-art accuracy on a number of standard benchmark datasets for graph-based semi-supervised learning, improving significantly over existing approaches. These architectures alternate between a propagation layer that aggregates the hidden states of the local neighborhood and a fully-connected layer. Perhaps surprisingly, we show that...
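
A rough PyTorch sketch of the alternation described above: a propagation step that aggregates each node's local neighborhood, followed by a fully-connected layer. The mean aggregation, layer sizes, and activation are assumptions for illustration rather than the paper's exact design.

import torch
import torch.nn as nn

class PropagateThenProject(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.fc = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Propagation: average the hidden states of each node's neighbors.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        x = (adj @ x) / deg
        # Fully-connected layer applied to the aggregated representations.
        return torch.relu(self.fc(x))

# Example: 4 nodes on a path graph with 8-dimensional features.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
h = PropagateThenProject(8, 16)(torch.randn(4, 8), adj)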


Graph Hybrid Summarization

One solution for processing and analyzing massive graphs is summarization. Generating a high-quality summary is the main challenge of graph summarization. To generate a better-quality summary for a given attributed graph, both structural and attribute similarities must be considered. There are two measures, named density and entropy, to evaluate the quality of structural and attribute...
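
For readers unfamiliar with the two measures named above, the sketch below computes one common form of group density and attribute entropy for an attributed graph. These particular definitions are assumptions for illustration and may differ from the ones used in the paper.

import math
from collections import Counter

def group_density(group, edges):
    # Fraction of possible undirected edges inside the group that actually exist.
    group = set(group)
    n = len(group)
    if n < 2:
        return 0.0
    internal = sum(1 for u, v in edges if u in group and v in group)
    return internal / (n * (n - 1) / 2)

def attribute_entropy(values):
    # Shannon entropy of an attribute's value distribution within a group;
    # lower entropy means more homogeneous attribute values.
    counts = Counter(values)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example: a 3-node group with 2 internal edges and a categorical attribute.
print(group_density({1, 2, 3}, [(1, 2), (2, 3), (3, 7)]))  # 0.67
print(attribute_entropy(["news", "news", "blog"]))          # ~0.92 bits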


AttSum: Joint Learning of Focusing and Summarization with Neural Attention

Query relevance ranking and sentence saliency ranking are the two main tasks in extractive query-focused summarization. Previous supervised summarization systems often perform the two tasks in isolation. However, since reference summaries are a trade-off between relevance and saliency, neither of the two rankers can be trained well using them as supervision. This paper proposes a novel sum...


Improving ROUGE for Timeline Summarization

Current evaluation metrics for timeline summarization either ignore the temporal aspect of the task or require strict date matching. We introduce variants of ROUGE that allow alignment of daily summaries via temporal distance or semantic similarity. We argue for the suitability of these variants in a theoretical analysis and demonstrate it in a battery of task-specific tests.
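
To make the idea concrete, here is a hedged sketch of the temporal-distance variant: each system-predicted date is aligned to the nearest reference date within a window, and the per-day score is discounted by the distance. The unigram F1 stand-in for ROUGE, the two-day window, and the 1/(1+distance) discount are assumptions for illustration, not the exact definitions from the paper.

from datetime import date

def unigram_f1(pred_tokens, ref_tokens):
    # Stand-in for ROUGE-1 F1 on one pair of daily summaries.
    overlap = len(set(pred_tokens) & set(ref_tokens))
    if overlap == 0:
        return 0.0
    p, r = overlap / len(pred_tokens), overlap / len(ref_tokens)
    return 2 * p * r / (p + r)

def temporally_aligned_score(system, reference, max_days=2):
    # system / reference: dict mapping datetime.date -> list of summary tokens.
    total = 0.0
    for day, pred in system.items():
        # Align to the closest reference date within the allowed distance.
        candidates = [r for r in reference if abs((r - day).days) <= max_days]
        if not candidates:
            continue
        best = min(candidates, key=lambda r: abs((r - day).days))
        # Discount by temporal distance so off-by-a-day matches count for less.
        weight = 1.0 / (1 + abs((best - day).days))
        total += weight * unigram_f1(pred, reference[best])
    return total / max(len(system), 1)

sys_tl = {date(2011, 2, 11): ["mubarak", "resigns"]}
ref_tl = {date(2011, 2, 12): ["mubarak", "resigns", "after", "protests"]}
print(temporally_aligned_score(sys_tl, ref_tl))  # 0.5 * F1 of the aligned pair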



Journal

Journal title: Shizen gengo shori

Year: 2023

ISSN: 1340-7619, 2185-8314

DOI: https://doi.org/10.5715/jnlp.30.184